Joint L1/2-Norm Constraint and Graph-Laplacian PCA Method for Feature Extraction

Authors

  • Chun-Mei Feng
  • Ying-Lian Gao
  • Jin-Xing Liu
  • Juan Wang
  • Dong-Qin Wang
  • Chang-Gang Wen
Abstract

Principal Component Analysis (PCA) is widely used as a tool for dimensionality reduction in many areas. In bioinformatics, each variable involved corresponds to a specific gene. To improve the robustness of PCA-based methods, this paper proposes a novel graph-Laplacian PCA algorithm that imposes an L1/2 constraint (L1/2 gLPCA) on the error function for feature (gene) extraction. The L1/2-norm-based error function reduces the influence of outliers and noise. The Augmented Lagrange Multipliers (ALM) method is applied to solve the resulting subproblem. Extensive experimental results on simulated data and gene expression data sets demonstrate that the proposed method achieves higher identification accuracies in feature extraction than other state-of-the-art PCA-based methods.


Similar Articles

A collaborative representation based projections method for feature extraction

In graph-embedding-based methods, one usually needs to manually choose the nearest neighbors and then compute the edge weights from those neighbors via the L2 norm (e.g., LLE). Manually choosing nearest neighbors in a high-dimensional space is difficult and unstable, so automatically constructing the graph is very important. In this paper, we first give an L2-graph analogous to the L1-graph. L2-gra...


An efficient algorithm for L1-norm principal component analysis

Principal component analysis (PCA), also called the Karhunen-Loève transform, has been widely used for dimensionality reduction, denoising, feature selection, subspace detection, and other purposes. However, traditional PCA minimizes the sum of squared errors and suffers from both outliers and large feature noise. The L1-norm-based PCA (more precisely, the L1,1 norm) is more robust. Yet, the optimizatio...


Robust Image Analysis by L1-Norm Semi-supervised Learning

This paper presents a novel L1-norm semi-supervised learning algorithm for robust image analysis by giving a new L1-norm formulation of Laplacian regularization, which is the key step of graph-based semi-supervised learning. Since our L1-norm Laplacian regularization is defined directly over the eigenvectors of the normalized Laplacian matrix, we successfully formulate semi-supervised learning as a...


l2,1 regularized correntropy for robust feature selection

In this paper, we study the problem of robust feature extraction based on l2,1 regularized correntropy from both theoretical and algorithmic perspectives. In the theoretical part, we point out that an l2,1-norm minimization can be justified from the viewpoint of half-quadratic (HQ) optimization, which facilitates convergence study and algorithmic development. In particular, a general formulation is accordi...


Feature Selection via Joint Embedding Learning and Sparse Regression

The problem of feature selection has aroused considerable research interest in the past few years. Traditional learning-based feature selection methods separate embedding learning and feature ranking. In this paper, we introduce a novel unsupervised feature selection approach via Joint Embedding Learning and Sparse Regression (JELSR). Instead of simply employing the graph Laplacian for embeddi...



Journal:

Volume 2017  Issue 

Pages  -

Publication date: 2017